87 research outputs found

    Integer linear programming vs. graph-based methods in code generation

    A common characteristic of many applications is that they are aimed at the high-volume consumer market, which is extremely cost-sensitive. However, many of them impose stringent performance demands on the underlying system. Therefore, code generation must take into account the restrictions and features of the target architecture while satisfying these performance demands. High-level language compilers are often unable to generate code meeting these requirements. One reason is the phase coupling problem between instruction scheduling and register allocation. Many compilers perform these tasks separately, with each phase ignorant of the requirements of the other. Commonly, each task is accomplished using heuristic methods. As the goals of the two phases often conflict, whichever phase is performed first imposes constraints on the other, sometimes producing inefficient code. Integer linear programming (ILP) provides an integrated approach to the combined instruction scheduling and register allocation problem. This way, optimal solutions can be found, albeit at the cost of high compilation times. In our experiments, we considered the 32-bit DSP ADSP-2106x as the target processor. We have examined two different ILP formulations and compared them with conventional approaches, including list scheduling and the critical path method. Moreover, we have investigated approximations based on the ILP formulations; this way, compilation time can be reduced considerably while still producing near-optimal results. From the results of our implementation, we have concluded that integrating ILP formulations into conventional global algorithms is a promising method for generating high-quality code.
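The trade-off the abstract describes can be illustrated with a toy model. The sketch below is not the paper's ADSP-2106x ILP formulation: it replaces the ILP solver with an exhaustive search over all legal schedules of a tiny dependence graph (instruction names and latencies are invented), and contrasts the exact optimum with a naive in-program-order schedule to show why phase-unaware heuristics can lose cycles.

```python
from itertools import permutations

# Toy single-issue machine: one instruction per cycle, results become
# available `lat` cycles after issue.  Instructions and latencies are
# illustrative only, not taken from the paper's formulation.
deps = {"a": [], "b": [], "c": [], "d": ["a"], "e": ["b", "c", "d"]}
lat  = {"a": 3, "b": 1, "c": 1, "d": 1, "e": 1}

def topological(order):
    """A schedule is legal only if every instruction follows its predecessors."""
    seen = set()
    for i in order:
        if any(p not in seen for p in deps[i]):
            return False
        seen.add(i)
    return True

def makespan(order):
    """Issue cycles with stalls: an instruction waits for its operands."""
    issue = {}
    for i in order:
        ready = max((issue[p] + lat[p] for p in deps[i]), default=0)
        prev = max(issue.values(), default=-1)
        issue[i] = max(prev + 1, ready)
    return max(issue[i] + lat[i] for i in issue)

# Exact optimum: exhaustively search every legal schedule, standing in
# for the ILP solver's optimality guarantee (feasible only at toy size).
optimal = min(makespan(o) for o in permutations(deps) if topological(o))

# A naive in-program-order schedule for contrast.
naive = makespan(["b", "c", "a", "d", "e"])

print(optimal, naive)  # prints 5 7
```

Issuing the long-latency `a` first hides its latency behind independent work, which is exactly the kind of global decision an integrated formulation can make and a fixed-phase heuristic may miss.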
    CompCert - A Formally Verified Optimizing Compiler

    CompCert is the first commercially available optimizing compiler that is formally verified, using machine-assisted mathematical proofs, to be exempt from mis-compilation. The executable code it produces is proved to behave exactly as specified by the semantics of the source C program. This article gives an overview of the design of CompCert and its proof concept, and then focuses on aspects relevant for industrial application. We briefly summarize practical experience and give an overview of recent CompCert development aimed at industrial usage. CompCert's intended use is the compilation of life-critical and mission-critical software meeting high levels of assurance. In this context, tool qualification is of paramount importance. We summarize the confidence argument of CompCert and give an overview of relevant qualification strategies.

    Reconstructing microstructures from statistical descriptors using neural cellular automata

    The problem of generating microstructures of complex materials in silico has been approached from various directions, including simulation-based, Markov-based, deep learning, and descriptor-based approaches. This work presents a hybrid method that is inspired by all four categories and has interesting scalability properties. A neural cellular automaton is trained to evolve microstructures based on local information. Unlike most machine learning-based approaches, it does not directly require a data set of reference micrographs, but is trained from statistical microstructure descriptors that can stem from a single reference. This means that the training cost scales only with the complexity of the structure and the associated descriptors. Since the size of the reconstructed structures can be set during inference, even extremely large structures can be generated efficiently. Similarly, the method is very efficient if many structures are to be reconstructed from the same descriptor for statistical evaluations. The method is formulated and discussed in detail by means of various numerical experiments, demonstrating its utility and scalability.
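The descriptor-driven training idea can be sketched in a few lines. The toy below is a heavily simplified stand-in for the paper's method: the "descriptor" is just the volume fraction, the local rule is a hand-rolled 3x3 perceptron, and training is gradient-free random search rather than the paper's procedure; all names and parameters are invented for illustration.

```python
import numpy as np

def step(grid, w):
    # Local update: each cell sees its 3x3 neighbourhood sum and its own state.
    nb = sum(np.roll(np.roll(grid, dx, axis=0), dy, axis=1)
             for dx in (-1, 0, 1) for dy in (-1, 0, 1))
    return 1.0 / (1.0 + np.exp(-(w[0] * nb + w[1] * grid + w[2])))

def rollout(w, n=32, steps=8, seed=1):
    # Evolve a random initial field under the local rule.
    grid = np.random.default_rng(seed).random((n, n))
    for _ in range(steps):
        grid = step(grid, w)
    return grid

def descriptor(grid):
    # Simplest possible statistical descriptor: the volume fraction.
    return grid.mean()

target = 0.3  # measured once from a single reference structure
rng = np.random.default_rng(0)

# Gradient-free "training": random search over the 3 rule parameters,
# matching the descriptor rather than any reference micrograph.
best_w, best_loss = None, np.inf
for _ in range(300):
    w = rng.normal(size=3)
    loss = (descriptor(rollout(w)) - target) ** 2
    if loss < best_loss:
        best_w, best_loss = w, loss

# Inference at a larger size than training: the purely local rule
# transfers, which is the scalability property the abstract highlights.
big = rollout(best_w, n=128)
```

Because the rule only ever reads a 3x3 neighbourhood, the grid size at inference is a free parameter, and many independent structures can be sampled from the same trained rule by varying the initial seed.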

    What Do We Want From Explainable Artificial Intelligence (XAI)? -- A Stakeholder Perspective on XAI and a Conceptual Model Guiding Interdisciplinary XAI Research

    Previous research in Explainable Artificial Intelligence (XAI) suggests that a main aim of explainability approaches is to satisfy specific interests, goals, expectations, needs, and demands regarding artificial systems (we call these stakeholders' desiderata) in a variety of contexts. However, the literature on XAI is vast and spread across multiple, largely disconnected disciplines, and it often remains unclear how explainability approaches are supposed to achieve the goal of satisfying stakeholders' desiderata. This paper discusses the main classes of stakeholders calling for explainability of artificial systems and reviews their desiderata. We provide a model that explicitly spells out the main concepts and relations necessary to consider and investigate when evaluating, adjusting, choosing, and developing explainability approaches that aim to satisfy stakeholders' desiderata. This model can serve researchers from the variety of different disciplines involved in XAI as a common ground. It emphasizes where there is interdisciplinary potential in the evaluation and the development of explainability approaches. (57 pages, 2 figures, 1 table; to be published in Artificial Intelligence. Markus Langer, Daniel Oster, and Timo Speith share first authorship of this paper.)

    Revelation of interfacial energetics in organic multiheterojunctions

    Efficient charge generation via exciton dissociation in organic bulk heterojunctions necessitates donor–acceptor interfaces, e.g., between a conjugated polymer and a fullerene derivative. Furthermore, aggregation and the corresponding structural order of polymer and fullerene domains result in energetic relaxations of molecular energy levels toward smaller energy gaps compared to the amorphous phases existing in homogeneously intermixed polymer:fullerene blends. Here it is shown that these molecular energy level shifts are reflected in interfacial charge transfer (CT) transitions, depending on whether disordered or ordered interfacial domains exist. This is achieved by systematically controlling the order at the donor–acceptor interface via ternary blending of semicrystalline and amorphous model polymers with a fullerene acceptor. These variations in interfacial domain order are probed with luminescence spectroscopy, yielding various transition energies due to the activation of different recombination channels at the interface. Finally, it is shown that via this analysis the energy landscape at the organic heterojunction interface can be obtained.

    Genetically Encoded Voltage Indicators in Circulation Research

    Membrane potentials display the cellular status of non-excitable cells and mediate communication between excitable cells via action potentials. The use of genetically encoded biosensors employing fluorescent proteins allows a non-invasive, biocompatible way to read out the membrane potential in cardiac myocytes and other cells of the circulation system. Although the approaches to designing such biosensors date back to the time when the first fluorescent-protein-based Förster Resonance Energy Transfer (FRET) sensors were constructed, it took 15 years before reliable sensors became readily available. Here, we review different developments of genetically encoded membrane potential sensors. Furthermore, it is shown how such sensors can be used in pharmacological screening applications as well as in circulation-related basic biomedical research. Potentials and limitations are discussed, and perspectives on possible future developments are provided.

    Recent developments in the characterization of superconducting films by microwaves

    We describe and analyze selected surface impedance data recently obtained by different groups on cuprate, ruthenate, and diboride superconducting films on metallic and dielectric substrates for fundamental studies and microwave applications. The discussion includes a first review of microwave data on MgB2, the weak-link behaviour of RABiTS-type YBa2Cu3O7-d tapes, and the observation of a strongly anomalous power dependence of the microwave losses in MgO at low temperatures. We demonstrate how microwave measurements can be used to investigate electronic, magnetic, and dielectric dissipation and relaxation in the films and substrates. The impact of such studies reaches from the extraction of microscopic information to the engineering of materials, and further on to applications in power systems and communication technology. (Invited contribution to EUCAS 2001; accepted for publication in Physica C in its present form.)